Time Integration of Rank-Constrained Tucker Tensors
Authors
Abstract
Similar Articles
Recompression of Hadamard Products of Tensors in Tucker Format
The Hadamard product features prominently in tensor-based algorithms in scientific computing and data analysis. Due to its tendency to significantly increase ranks, the Hadamard product can represent a major computational obstacle in algorithms based on low-rank tensor representations. It is therefore of interest to develop recompression techniques that mitigate the effects of this rank increas...
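The rank growth described above is easy to observe numerically. The following sketch (illustrative, not from the paper) shows that the Hadamard product of two rank-2 matrices generically has rank 4, i.e. up to the product of the factor ranks:

```python
import numpy as np

# Illustrative sketch: the elementwise (Hadamard) product of two
# rank-r matrices can have rank up to r**2 — here r = 2 gives rank 4.
rng = np.random.default_rng(0)
n, r = 20, 2
A = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # rank 2
B = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # rank 2
H = A * B  # Hadamard product

print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B),
      np.linalg.matrix_rank(H))
```

For generic random factors the product rank is exactly rank(A) · rank(B); in higher-order tensor formats the same multiplicative growth is what makes recompression necessary.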
Dynamical approximation of hierarchical Tucker and tensor-train tensors
We extend results on the dynamical low-rank approximation for the treatment of time-dependent matrices and tensors (Koch & Lubich, 2007 and 2010) to the recently proposed Hierarchical Tucker tensor format (HT, Hackbusch & Kühn, 2009) and the Tensor Train format (TT, Oseledets, 2011), which are closely related to tensor decomposition methods used in quantum physics and chemistry. In this dynamic...
Black Box Approximation of Tensors in Hierarchical Tucker Format
We derive and analyse a scheme for the approximation of order-d tensors A ∈ R^(n×···×n) in the hierarchical (H-) Tucker format, a dimension-multilevel variant of the Tucker format and strongly related to the TT format. For a fixed rank parameter k, the storage complexity of a tensor in H-Tucker format is O(dk^3 + dnk) and we present a (heuristic) algorithm that finds an approximation to a tensor ...
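The storage bound quoted in the abstract can be made concrete with a small count. The sketch below (parameter values are illustrative, not from the paper) compares the O(dk^3 + dnk) entry count of the H-Tucker representation with the n^d entries of the full tensor, up to constants:

```python
# Hedged sketch: comparing H-Tucker storage O(d*k**3 + d*n*k) with the
# full n**d entries of an order-d tensor (counts are up to constants).
d, n, k = 10, 100, 10                       # illustrative sizes
full_entries = n ** d                       # 100**10 = 10**20 entries
htucker_entries = d * k ** 3 + d * n * k    # 10000 + 10000 = 20000

print(full_entries, htucker_entries)
```

The linear dependence on the order d (rather than exponential, as for the full tensor) is what makes the format usable for high-dimensional problems.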
htucker – A Matlab toolbox for tensors in hierarchical Tucker format
The hierarchical Tucker format is a storage-efficient scheme to approximate and represent tensors of possibly high order. This paper presents a Matlab toolbox, along with the underlying methodology and algorithms, which provides a convenient way to work with this format. The toolbox not only allows for the efficient storage and manipulation of tensors but also offers a set of tools for the deve...
Scalable Tucker Factorization for Sparse Tensors - Algorithms and Discoveries
Given sparse multi-dimensional data (e.g., (user, movie, time; rating) for movie recommendations), how can we discover latent concepts/relations and predict missing values? Tucker factorization has been widely used to solve such problems with multi-dimensional data, which are modeled as tensors. However, most Tucker factorization algorithms regard and estimate missing entries as zeros, which tr...
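The distinction the abstract draws — evaluating the fit only on observed entries rather than zero-filling missing ones — can be sketched as follows. This is an illustrative masked-loss computation for a Tucker reconstruction, not the paper's algorithm; all names and sizes are assumptions:

```python
import numpy as np

# Hedged sketch: for sparse data, the factorization loss should be
# restricted to observed entries instead of treating missing ones as 0.
rng = np.random.default_rng(1)
X = rng.standard_normal((4, 5, 6))
observed = rng.random(X.shape) < 0.3   # boolean mask of known entries

# Tucker reconstruction from a core G and factor matrices U1, U2, U3.
G = rng.standard_normal((2, 2, 2))
U1, U2, U3 = (rng.standard_normal((s, 2)) for s in X.shape)
Xhat = np.einsum('abc,ia,jb,kc->ijk', G, U1, U2, U3)

# Squared loss over observed entries only; zero-filling would instead
# penalize the reconstruction on every missing entry as if it were 0.
masked_loss = np.sum((X - Xhat)[observed] ** 2)
```

Restricting the loss this way is what lets a factorization predict, rather than reproduce, the missing values.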
Journal
Journal title: SIAM Journal on Numerical Analysis
Year: 2018
ISSN: 0036-1429,1095-7170
DOI: 10.1137/17m1146889